DTOcean File Input / Output Examples


In [1]:
%matplotlib inline

In [2]:
from IPython.display import display, HTML

In [3]:
import matplotlib.pyplot as plt
plt.rcParams['figure.figsize'] = (14.0, 8.0)

In [4]:
import numpy as np

In [5]:
from aneris.control.factory import InterfaceFactory

In [6]:
from dtocean_core import start_logging
from dtocean_core.core import Core, AutoFileInput, AutoFileOutput
from dtocean_core.menu import DataMenu, ModuleMenu, ProjectMenu
from dtocean_core.pipeline import Tree
from dtocean_core.data import CoreMetaData

In [7]:
def html_list(x):
    message = "<ul>"
    for name in x:
        message += "<li>{}</li>".format(name)
    message += "</ul>"
    return message
def html_dict(x):
    message = "<ul>"
    for name, status in x.items():
        message += "<li>{}: <b>{}</b></li>".format(name, status)
    message += "</ul>"
    return message

In [8]:
# Bring up the logger
start_logging()


2019-03-09 10:57:57,595 - INFO - dtocean_core - Begin logging for dtocean_core

Create the core, menus and pipeline tree


In [9]:
new_core = Core()
data_menu = DataMenu()
project_menu = ProjectMenu()
module_menu = ModuleMenu()
pipe_tree = Tree()

Create a new project


In [10]:
project_title = "DTOcean"  
new_project = project_menu.new_project(new_core, project_title)


2019-03-09 10:58:01,585 - INFO - aneris.entity.simulation - Created new Simulation with title "Default"
2019-03-09 10:58:01,588 - INFO - aneris.control.simulation - Datastate with level "initial" stored
2019-03-09 10:58:01,588 - INFO - aneris.control.pipeline - New Hub created for interface ProjectInterface.

Set the device type


In [11]:
options_branch = pipe_tree.get_branch(new_core, new_project, "System Type Selection")
variable_id = "device.system_type"
my_var = options_branch.get_input_variable(new_core, new_project, variable_id)
my_var.set_raw_interface(new_core, "Tidal Fixed")
my_var.read(new_core, new_project)


2019-03-09 10:58:01,611 - INFO - aneris.control.data - New "device.system_type" data stored with index 984JNG
2019-03-09 10:58:01,618 - INFO - aneris.control.simulation - Datastate stored
2019-03-09 10:58:01,618 - INFO - dtocean_core.core - Data added for identifier 'device.system_type'

Initiate the pipeline


In [12]:
project_menu.initiate_pipeline(new_core, new_project)


2019-03-09 10:58:01,637 - INFO - aneris.control.simulation - Datastate with level "system type selection start" stored
2019-03-09 10:58:01,645 - INFO - aneris.control.data - New "hidden.pipeline_active" data stored with index C9M4HJ
2019-03-09 10:58:01,648 - INFO - aneris.control.simulation - Datastate stored
2019-03-09 10:58:01,651 - INFO - dtocean_core.core - Data added for identifier 'hidden.pipeline_active'
2019-03-09 10:58:01,654 - INFO - aneris.control.pipeline - New Pipeline created for interface ModuleInterface.
2019-03-09 10:58:01,657 - INFO - aneris.control.pipeline - New Hub created for interface ThemeInterface.

Discover available modules


In [13]:
names = module_menu.get_available(new_core, new_project)
message = html_list(names)
HTML(message)


Out[13]:
  • Hydrodynamics
  • Electrical Sub-Systems
  • Mooring and Foundations
  • Installation
  • Operations and Maintenance

Activate a module


In [14]:
module_name = 'Hydrodynamics'
module_menu.activate(new_core, new_project, module_name)
hydro_branch = pipe_tree.get_branch(new_core, new_project, 'Hydrodynamics')

Initiate the dataflow


In [15]:
project_menu.initiate_dataflow(new_core, new_project)


2019-03-09 10:58:01,710 - INFO - aneris.control.data - New "hidden.dataflow_active" data stored with index 6QWKGE
2019-03-09 10:58:01,711 - INFO - aneris.control.simulation - Datastate stored
2019-03-09 10:58:01,713 - INFO - dtocean_core.core - Data added for identifier 'hidden.dataflow_active'
2019-03-09 10:58:01,729 - INFO - aneris.control.simulation - Datastate with level "modules initial" stored

Move the system to the post-filter state and ready it for data input


In [16]:
new_core.inspect_level(new_project, "modules initial")
new_core.reset_level(new_project, preserve_level=True)


2019-03-09 10:58:01,742 - INFO - dtocean_core.core - Inspecting level modules initial
2019-03-09 10:58:01,759 - INFO - dtocean_core.core - Inspecting level modules initial
2019-03-09 10:58:01,762 - INFO - dtocean_core.core - Resetting to level modules initial

Check the status of the inputs

  • satisfied - data is in the data state
  • required - data is not in the data state and must be provided
  • optional - data is not in the data state but is not mandatory
  • unavailable - data will come from another source
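
The status dictionary returned by `get_input_status` can also be filtered programmatically. A minimal sketch, using an illustrative dictionary rather than a live project:

```python
# Illustrative input-status dictionary, mimicking get_input_status output
input_status = {"device.system_type": "satisfied",
                "device.power_rating": "required",
                "farm.nogo_areas": "optional"}

# Collect just the identifiers that still need data
required = sorted(k for k, v in input_status.items() if v == "required")
print(required)  # -> ['device.power_rating']
```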

In [17]:
input_status = hydro_branch.get_input_status(new_core, new_project)
message = html_dict(input_status)
HTML(message)


Out[17]:
  • options.user_array_option: required
  • project.rated_power: required
  • device.power_rating: required
  • device.bidirection: required
  • farm.nogo_areas: optional
  • device.turbine_hub_height: required
  • farm.tidal_series: required
  • device.turbine_interdistance: optional
  • device.installation_depth_max: required
  • farm.tidal_occurrence_point: required
  • device.yaw: required
  • device.system_type: satisfied
  • device.turbine_performance: required
  • options.tidal_data_directory: optional
  • device.cut_in_velocity: required
  • farm.blockage_ratio: required
  • options.user_array_layout: optional
  • device.turbine_diameter: required
  • options.power_bin_width: optional
  • bathymetry.layers: required
  • device.minimum_distance_x: required
  • device.minimum_distance_y: required
  • options.boundary_padding: optional
  • corridor.landing_point: required
  • device.installation_depth_min: required
  • project.tidal_occurrence_nbins: required
  • bathymetry.mannings: required
  • device.cut_out_velocity: required
  • project.main_direction: optional
  • site.lease_boundary: required
  • options.optimisation_threshold: required

Read TableData Structure (device.turbine_performance)

Load a TableData structure from a file. Note that device.turbine_performance is actually a LineTable structure, but as LineTable subclasses TableData it uses the same AutoFileInput / AutoFileOutput methods.

As the next cell shows, .csv, .xls and .xlsx files are supported.
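
As an illustration of the expected file layout (a hypothetical example modelled on the data shown below), such a CSV can be parsed with pandas, indexing on the velocity column:

```python
import io

import pandas as pd

# Hypothetical tidal performance CSV: velocity-indexed Cp / Ct curves
csv_text = ("Velocity,Coefficient of Power,Coefficient of Thrust\n"
            "0,0.0,0.0\n"
            "1,0.2,0.1\n"
            "2,0.4,0.2\n")

# Read with the velocity column as the index, as a LineTable would expect
table = pd.read_csv(io.StringIO(csv_text), index_col="Velocity")
print(table.shape)  # -> (3, 2)
```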


In [18]:
new_var_id = "device.turbine_performance"
new_var = hydro_branch.get_input_variable(new_core, new_project, new_var_id)

In [19]:
new_var.get_file_input_interfaces(new_core, include_auto=True)


Out[19]:
{'device.turbine_performance AutoFileInput Interface': ['.csv',
  '.xls',
  '.xlsx']}

In [21]:
new_var.read_file(new_core,
                  new_project,
                  "test_data/tidal_performance.csv")


2019-03-09 10:58:20,809 - INFO - aneris.control.data - New "device.turbine_performance" data stored with index ZLNGKN
2019-03-09 10:58:20,812 - INFO - aneris.control.simulation - Datastate stored
2019-03-09 10:58:20,812 - INFO - dtocean_core.core - Data added for identifier 'device.turbine_performance'

Recheck the status


In [22]:
input_status = hydro_branch.get_input_status(new_core, new_project)
var_status = {new_var_id: input_status[new_var_id]}
message = html_dict(var_status)
HTML(message)


Out[22]:
  • device.turbine_performance: satisfied

Examine the Data


In [23]:
new_var.get_value(new_core, new_project)


Out[23]:
          Coefficient of Power  Coefficient of Thrust
Velocity
0                          0.0                    0.0
1                          0.2                    0.1
2                          0.4                    0.2
3                          0.6                    0.4
4                          0.8                    0.8
5                          1.0                    1.6

Write TableData Structure (device.turbine_performance)

We can also do the reverse process, writing the data back to a file.


In [24]:
new_var.write_file(new_core,
                   new_project,
                   "tidal_performance_copy.csv")
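
A quick way to sanity-check a write/read round trip is with plain pandas (a generic sketch using an in-memory buffer, not the DTOcean API):

```python
import io

import pandas as pd

# Build a small performance table with a named velocity index
original = pd.DataFrame(
    {"Coefficient of Power": [0.0, 0.2, 0.4],
     "Coefficient of Thrust": [0.0, 0.1, 0.2]},
    index=pd.Index([0, 1, 2], name="Velocity"))

# Write to an in-memory buffer instead of a real file
buffer = io.StringIO()
original.to_csv(buffer)
buffer.seek(0)

# Read it back and confirm nothing was lost
copy = pd.read_csv(buffer, index_col="Velocity")
assert copy.equals(original)
```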

TimeTable

We will fake an AutoFileInput interface for this structure. First, build the necessary metadata (see tests/test_structures.py)


In [25]:
meta = CoreMetaData({"identifier": "test",
                     "structure": "test",
                     "title": "test",
                     "labels": ["a", "b"],
                     "units": ["kg", None]})
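
The raw data a TimeTable expects resembles a table with a DateTime column plus one column per label. A minimal pandas sketch, with column names taken from the metadata above:

```python
import pandas as pd

# Columns mirror the metadata: a DateTime axis plus labels "a" and "b"
raw = pd.DataFrame({
    "DateTime": pd.date_range("1970-01-01", periods=3, freq="s"),
    "a": [0.0, 0.2, 0.4],
    "b": [0.0, 0.1, 0.2]})

# The processed structure indexes on DateTime, as shown later in Out[33]
processed = raw.set_index("DateTime")
print(processed.shape)  # -> (3, 2)
```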

Create the Structure object and get the processed data format


In [26]:
data_obj = new_core.control.get_structure("TimeTable")

Build the fake AutoFileInput interface


In [27]:
interface_factory = InterfaceFactory(AutoFileInput)
AutoCls = interface_factory(meta, data_obj)
test = AutoCls()
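
The `InterfaceFactory` call above returns a new class specialised for the given metadata and structure. The underlying pattern can be sketched generically (illustrative only, not the aneris implementation):

```python
# Minimal class-factory sketch: build a subclass with metadata baked in
class BaseInterface(object):
    meta = None
    def describe(self):
        return "Interface for {}".format(self.meta["identifier"])

def interface_factory(base, meta):
    # The three-argument form of type() builds a new class on the fly
    name = "{}AutoInterface".format(meta["identifier"].capitalize())
    return type(name, (base,), {"meta": meta})

AutoCls = interface_factory(BaseInterface, {"identifier": "test"})
print(AutoCls().describe())  # -> Interface for test
```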

Add the file path to the interface's ._path attribute and then call the AutoFileInput connect method


In [30]:
test._path = "test_data/date_test.csv"

In [31]:
test.connect()

Examine the data in the interface


In [32]:
test.data.result


Out[32]:
                        DateTime    a    b
0  1970-01-01 00:00:00.000000000  0.0  0.0
1  1970-01-01 00:00:00.000000001  0.2  0.1
2  1970-01-01 00:00:00.000000002  0.4  0.2
3  1970-01-01 00:00:00.000000003  0.6  0.4
4  1970-01-01 00:00:00.000000004  0.8  0.8
5  1970-01-01 00:00:00.000000005  1.0  1.6

Try to load the result into the Structure


In [33]:
data_value = data_obj.get_data(test.data.result, meta)
data_value


Out[33]:
                                 a    b
DateTime
1970-01-01 00:00:00.000000000  0.0  0.0
1970-01-01 00:00:00.000000001  0.2  0.1
1970-01-01 00:00:00.000000002  0.4  0.2
1970-01-01 00:00:00.000000003  0.6  0.4
1970-01-01 00:00:00.000000004  0.8  0.8
1970-01-01 00:00:00.000000005  1.0  1.6

Build the fake AutoFileOutput interface


In [34]:
interface_factory = InterfaceFactory(AutoFileOutput)
AutoCls = interface_factory(meta, data_obj)
test = AutoCls()

Add the file path to the interface's ._path attribute and then call the AutoFileOutput connect method


In [35]:
test._path = "date_test_copy.csv"

In [36]:
test.put_data("test", data_value)

In [37]:
test.connect()
